Guild Wars Forums - GW Guru
 
 

Go Back   Guild Wars Forums - GW Guru > Forest of True Sight > Technician's Corner

Old Nov 04, 2005, 04:51 AM // 04:51   #21
Krytan Explorer
 
Join Date: Aug 2005
Guild: Elite Black Ops
Profession: W/Mo
Default

What is overclocking? I can't find it explained anywhere. What does it mean, and what does it do?
A NERD1989 is offline   Reply With Quote
Old Nov 04, 2005, 06:25 AM // 06:25   #22
Frost Gate Guardian
 
swaaye's Avatar
 
Join Date: May 2005
Default

I wouldn't put X800GT in there. It's an 8-pipe card that costs almost the same as the 12-pipe X800GTO (stupid naming!!).

Couple of other ideas:

Add links to these excellent card roundups:
Tom's Hardware's VGA Charts Summer 2005 PCIe
Tom's Hardware's VGA Charts Summer 2005 AGP
Beyond3D's fantastic board/chip info chart <-- This is bar-none the best place to find out what you actually are buying or have already. Incredible.

Also, please tell people that they should NOT buy Geforce FX cards. As in any NV card with FX in the title.

My Radeon 8500 can run Guild Wars at 1680x1050, high quality, though the post-processing effects should be turned off. The 8500 can definitely do a lot better than low quality. I tried GW on a Radeon 7200 PCI and a GeForce 2 MX200 (lol), and those cards had to run at about 800x600, low quality.

Last edited by swaaye; Nov 04, 2005 at 06:27 PM // 18:27..
swaaye is offline   Reply With Quote
Old Nov 04, 2005, 07:42 PM // 19:42   #23
Frost Gate Guardian
 
Techie's Avatar
 
Join Date: Nov 2005
Location: Fairfield, Ohio
Profession: Mo/W
Default

OK, I will add the above, along with the 8500 correction.
Techie is offline   Reply With Quote
Old Nov 04, 2005, 11:52 PM // 23:52   #24
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

I'm still not sure why the 6800 Ultra isn't even mentioned in the full-eye-candy card list. The reason can't be price, since the 7800GTX is also listed. It is noticeably faster than my 6800GT in the same box (yes, I have both) and is still one of the quickest AGP solutions available.

I also agree with Swaaye's previous post mentioning that the FX-series is utter trash...

Last edited by lord_shar; Nov 04, 2005 at 11:56 PM // 23:56..
lord_shar is offline   Reply With Quote
Old Nov 05, 2005, 11:39 AM // 11:39   #25
Middle-Age-Man
 
Old Dood's Avatar
 
Join Date: May 2005
Location: Lansing, Mi
Profession: W/Mo
Default

The November issue of Millinum PC magazine has a nice video card list that breaks the cards down into price groups. Good review. The October issue has a decent how-to article on building a computer.
Old Dood is offline   Reply With Quote
Old Nov 05, 2005, 02:40 PM // 14:40   #26
Furnace Stoker
 
EternalTempest's Avatar
 
Join Date: Jun 2005
Location: United States
Guild: Dark Side Ofthe Moon [DSM]
Profession: E/
Default

Quote:
Originally Posted by swaaye
Also, please tell people that they should NOT buy Geforce FX cards. As in any NV card with FX in the title.
Quote:
Originally Posted by lord_shar
I also agree with Swaaye's previous post mentioning that the FX-series is utter trash...
My GeForce FX 5700 Ultra 128MB does very well and plays GW great, as well as many other games. I agree there is no reason to go with an FX card now, but that's because the GeForce 6 series is out and the 6600 is extremely affordable, with an AGP version as well. I do admit that later cards were much better, but at the time it was a positive step up from the GeForce 4 series, and the 6xxx series did not exist.
EternalTempest is offline   Reply With Quote
Old Nov 06, 2005, 04:03 AM // 04:03   #27
Frost Gate Guardian
 
swaaye's Avatar
 
Join Date: May 2005
Default

If it runs well, I'm quite happy for you. Guild Wars is not pixel-shader heavy, and that is why. In fact, I believe only the post-processing effect uses a shader.

It is not debatable that the Geforce FX series has absolutely terrible pixel shading performance. The 4 year old Radeon 9700, which came out even before the first GeforceFX card, is several times faster than the fastest GeforceFX at pure shading. Most games still are not very heavy on shading (undoubtedly because of the userbase of cards with limited shading capability). However games like FarCry, Everquest 2, Half Life 2, FEAR, and others like them will run EXTREMELY poorly unless the game has a specifically optimized path for GeforceFX.

The biggest reason R300 (9700) is faster than NV30 (FX) is that 9700 is really a more standard GPU design than NV30 was. nVidia created a pretty unique chip, one that changed the way graphics pipelines worked, and it was really ahead of its time. This unfortunately carried with it serious performance challenges though. ATI played it safe while NV played for the future. Neither was a better decision really, but for performance ATI definitely won out. 9700 was an absolute beast for the time. NV3x has less processing resources for DX9 effects than R300 by a long shot, and NV3x requires advanced compiler technology in the driver along with proper utilization by the game to get decent performance. R300 was a LOT easier to work with.

Half Life 2, for example, actually runs FX cards in DirectX 8 mode because although they have "full" DX9 support, the performance is just too terrible for them to run at that level, even with the developers attempting to tweak the game specifically for these cards.

The GeForce FX is basically a highly tuned DirectX 8-level card with DX9 shading capabilities present as features, but not with viable performance. The GeForce 4 Ti series is actually faster than most GeForce FX cards, though your 5700 is probably faster than, or at least equal to, a Ti4600. A GeForce FX5600 or lesser card is not faster. A friend of mine once upgraded from a Ti4400 to an FX5600 Ultra and told me flat out that he was quite disappointed.

Last edited by swaaye; Nov 06, 2005 at 04:12 AM // 04:12..
swaaye is offline   Reply With Quote
Old Nov 06, 2005, 04:40 AM // 04:40   #28
Furnace Stoker
 
EternalTempest's Avatar
 
Join Date: Jun 2005
Location: United States
Guild: Dark Side Ofthe Moon [DSM]
Profession: E/
Default

The reason I responded the way I did was the "harsh" language used to slam the FX.

I grant you that ATI's 96xx-and-up generation was better than the Nvidia GeForce FX 5700 Ultra at the time, and its pixel shader was much better than the Nvidia FX pixel shader.

The situation has since flipped: the Nvidia 6 and 7 series are better than current ATI, and the brand-new upper-end ATI parts just now coming out are on par, or slightly better in some cases. In six months, who knows; it will be next-gen ATI vs. next-gen Nvidia, and I will be reviewing many different web sites and reviews to see which is better.

Quote:
However games like FarCry, Everquest 2, Half Life 2, FEAR, and others like them will run EXTREMELY poorly unless the game has a specifically optimized path for GeforceFX.
A lot of games will drop to DX8 mode due to the sheer number of cards out there that don't support full DX9. I do own Far Cry, HL2, and Doom 3; they all play at medium or higher settings, look great, and are very playable. I also play with EAX turned on (Doom 3 got the support in the most recent patch, plus having the OpenAL drivers installed).

I'm also aware that the FX line is a bit out of date for the next-gen games coming out now; it's two generations behind. I will be picking up FEAR very soon and expect to play it on low to medium settings.

Last edited by EternalTempest; Nov 06, 2005 at 04:43 AM // 04:43..
EternalTempest is offline   Reply With Quote
Old Nov 06, 2005, 05:06 AM // 05:06   #29
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by EternalTempest
The reason why I responded the way I did was due to the "harsh" language used to slam the FX.

...<SNIP>...
Sorry about that -- I'll keep the rhetoric more toned down next time.

I tend to favor NVidia cards over ATI. I've owned every generation of GeForce GPU until the FX5000 line, which I completely skipped due to its serious performance issues. My current graphics card is a 6800 Ultra. The 7800 GTX is my next planned video card, but I haven't purchased it yet since I'm trying to decide between new PCIe-equipped desktop vs. laptop (Dell M170 w/ Nvidia 7800-Go GTX).
lord_shar is offline   Reply With Quote
Old Nov 06, 2005, 05:44 AM // 05:44   #30
Academy Page
 
Join Date: May 2005
Location: Newport News Va
Guild: Unknown Warriors of Ascalon
Profession: W/R
Default

I've got an odd problem. My desktop PC has 768MB of system RAM, an Nvidia FX5200 card with 128MB of RAM, and a 3.0GHz CPU, but the game locks up when I open my inventory, and the textures flicker quite a bit in-game. My laptop, with a 2.8GHz processor, 512MB of system RAM, and an ATI 9200 64MB video card, runs perfectly on high settings. I really want to play GW on my desktop PC, but I can't deal with the game locking up the way it does, so I'm forced to play on an inferior laptop because it runs the game better.

I don't know why this is happening, but it shouldn't be, since the video card in my desktop is better and it has more system RAM and CPU power. You may say that even though ATI supports the game there won't be any difference with Nvidia cards, but I've noticed otherwise, and it's really frustrating because I had the exact opposite problem with BF2, which is why I took the ATI card out of my desktop and replaced it with the Nvidia. For now I'll stick with the laptop until I can get a really good ATI video card that runs all my games properly.
Ironsword is offline   Reply With Quote
Old Nov 06, 2005, 05:49 AM // 05:49   #31
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by Ironsword
I got some odd problem I have a desktop PC with 768 mb sys ram and a nvidia 5200fx 128mb ram card my cpu is 3.0ghz but my game locks up when I open my inventory and the textures flicker quite a bit ingame...

...<SNIP>...
If you've swapped video cards with different GPU brands on your desktop, then you might have mixed ATI + NVidia drivers loaded in memory. There are a couple of video driver cleaner utilities out there, but I can't think of any names at the moment.

Also, the latest NVidia WHQL drivers have a known texture-corruption bug in GW. Loading the previous NVidia driver set should correct this.
lord_shar is offline   Reply With Quote
Old Nov 06, 2005, 08:38 PM // 20:38   #32
Frost Gate Guardian
 
Techie's Avatar
 
Join Date: Nov 2005
Location: Fairfield, Ohio
Profession: Mo/W
Default

Well oddly enough, DriverCleaner is the name of a great program :O

Google it and check it out.

And I will add the 6800 Ultra, but seriously, for that dough just get a 7800GTX.
Techie is offline   Reply With Quote
Old Nov 06, 2005, 08:58 PM // 20:58   #33
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by Techie
...<SNIP>...

And I will add 6800 Ultra, but seriously for that dough just get a 7800GTX.
I agree, but the 7800GTX is only available for PCIe slots. Older motherboards only have AGP slots, so for an older system the 7800 series isn't an option. And yes, the 6800 Ultra can run full eye candy at all resolutions.

Last edited by lord_shar; Nov 06, 2005 at 09:00 PM // 21:00..
lord_shar is offline   Reply With Quote
Old Nov 06, 2005, 11:31 PM // 23:31   #34
Frost Gate Guardian
 
Techie's Avatar
 
Join Date: Nov 2005
Location: Fairfield, Ohio
Profession: Mo/W
Default

Yes, I added it. It is great for running games at high resolutions; I'll agree with you on that. And the fact that it is AGP is another bonus. I guess I could recommend it if you want full eye candy at the highest, or close to highest, resolutions.
Techie is offline   Reply With Quote
Old Nov 07, 2005, 09:12 PM // 21:12   #35
Furnace Stoker
 
EternalTempest's Avatar
 
Join Date: Jun 2005
Location: United States
Guild: Dark Side Ofthe Moon [DSM]
Profession: E/
Default

Nvidia just added the GeForce 6800 GS to their lineup.
The 68xx series cards, from worst to best:

GeForce 6800 LE -> GeForce 6800 -> **GeForce 6800 GS** -> GeForce 6800 GT -> GeForce 6800 Ultra.

I also read that the 6800 Ultra is no longer being made.
http://www.guru3d.com/article/Videocards/278/

Last edited by EternalTempest; Nov 07, 2005 at 09:14 PM // 21:14..
EternalTempest is offline   Reply With Quote
Old Nov 07, 2005, 09:38 PM // 21:38   #36
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Strange, they have two 7800GTX models listed, but the 512MB version has a much higher GPU and memory clock!
lord_shar is offline   Reply With Quote
Old Nov 07, 2005, 10:53 PM // 22:53   #37
Furnace Stoker
 
EternalTempest's Avatar
 
Join Date: Jun 2005
Location: United States
Guild: Dark Side Ofthe Moon [DSM]
Profession: E/
Default

Quote:
Originally Posted by lord_shar
Strange, they have two listed 7800GTX, but the 512MB version has a much higher GPU + memory clock!
I didn't notice it until you pointed it out. Something else in the conclusion caught my eye: this card beats the yet-to-be-released ATI X1600 XT at the same price point. The only thing I could not find is whether the new card is AGP & PCI Express or PCI Express only. I'm pretty sure they tweaked the core GPU to tune it better than the existing 6x series cards.
EternalTempest is offline   Reply With Quote
Old Nov 08, 2005, 02:51 AM // 02:51   #38
Frost Gate Guardian
 
Techie's Avatar
 
Join Date: Nov 2005
Location: Fairfield, Ohio
Profession: Mo/W
Default

The X1600 XT will be a highly talked-about card once released; I might actually be able to test one, depending.

Oh and the 6800LE is a superb card for unlocking pipes and shaders. Leadtek's stock fan can do wonders.
Techie is offline   Reply With Quote
Old Nov 08, 2005, 10:07 AM // 10:07   #39
Jungle Guide
 
Join Date: May 2005
Default

Quote:
Originally Posted by EternalTempest

The situation had flipped to Nvidia 6 and 7 series better then current Ati, and the upper end brand new ati stuff just know coming out is on par, slightly better in some cases. In 6 months who knows, it will be next gen Ati vs next gen Nvidia and I will be reviewing many different web sites and reviews on them to see what is better.

Comparing the Nvidia 6000 series and ATI's X800 series, Nvidia isn't better. The 6800 Ultra cannot match the ATI X850 XT PE when AA/AF are notched up. Except for SM3.0 support, which arguably favours Nvidia over ATI's SM2.0b, the Nvidias aren't better.

The Nvidia 7000 series is meant to be compared with ATI's X1800 series. The jury is still out on this...
MaglorD is offline   Reply With Quote
Old Nov 08, 2005, 11:42 AM // 11:42   #40
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by MaglorD
Comparing Nvidia 6000 series and X800 series from ATI, Nvidia isn't better. The Nvidia card 6800 Ultra cannot match the ATI X850 XTPE when AA/AF are notched up. Except for SM3.0 support, which arguably favours Nvidia over ATI's SM2.0b, the Nvidias aren't better.
ATI's benchmarks with the X800s and X850s are a bit skewed due to the Catalyst driver's dynamic texture filtering. Also referred to as "bri-linear filtering," ATI's X800 series dynamically shifts between bilinear and trilinear filtering modes to achieve the best benchmarks. However, the X800s/X850s could not perform true trilinear filtering unless you turned off ATI's filtering optimizations, and once you disabled this feature, ATI's X800s/X850s fell behind NVidia's 6800s. ATI caught a lot of flak for the above and finally conceded by adding an "off" switch to its optimized texture filtering.

Why does the above matter? Simple: ATI was compromising video quality for the sake of benchmarks. NVidia did this in the past as well with their fx5000 series, so they're not squeaky-clean either. However, neither company should be resorting to such driver tweaks given the speed of their current card lines.

Drivers can always be updated, but you're stuck with the video card until you toss it, so you might as well get the best hardware possible until the next best thing comes out.

Quote:
Originally Posted by MaglorD
The Nvidia 7000 series is meant to be compared with ATI's X1800 series. The jury is still out on this...
Actually, it's the other way around: NVidia dropped a bombshell with the 7800GTX's release and immediate availability in mass volume. ATI could not immediately counter this move, so here we are five months later, just seeing the X1800s go through the usual trickle of a retail release.

ATI's X1800 series will be faster in some games (e.g.,Far Cry), and slower in others (Quake3-4, Doom3, etc). So it looks like equilibrium will be restored between the two competitors, which is good news for consumers like us.

Last edited by lord_shar; Nov 08, 2005 at 11:51 AM // 11:51..
lord_shar is offline   Reply With Quote


